Measurement Terminology

Definition:

  1. Measurand - The physical quantity being measured.

The elements of a generalized measurement system are

  • sensing element
  • signal modification
  • indicator
Figure 1: Generalized Measurement System Components

Definition:

  1. Error - The difference between a measurand's measured value and its true value.

There will always be some deviation between the actual value and the measurement system's output.

Is this OK?

Yes, as long as the deviation is acceptable for the experiment's intended purpose.

$$ error = \mid measured\ value - true\ value \mid $$
Figure 2: Error Represented On A Number Line
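The error definition above can be checked with a one-line computation (the values here are assumed for illustration, not taken from the notes):

```python
measured_value = 6.05  # in V (assumed reading)
true_value = 6.11      # in V (assumed true value)
error = abs(measured_value - true_value)
print(round(error, 2)) # 0.06, in V
```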

Definition:

  1. Systematic Error - An error that occurs with the same magnitude and sign every time a measurement is taken.

sometimes called a fixed or bias error

Mathematically, systematic error is

$$ systematic\ error = \mid average - true\ value \mid $$
Figure 3: Systematic Error of Repeated Measurements

Definition:

  1. Random Error - An error caused by the lack of repeatability in a measurement.

sometimes called a precision error

may be reduced by averaging many measurements

Figure 4: Range of Random Error

For a single measurement, the random error is given by

$$ random\ error = \mid specific\ reading - average \mid$$
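The claim above that averaging many measurements reduces random error can be sketched numerically. This is a hypothetical simulation (the 6.11 V source and 0.05 V noise level are assumptions), comparing the random error of a single reading against that of an average of 100 readings:

```python
import numpy as np

rng = np.random.default_rng(42)
true_value = 6.11  # in V (assumed)
noise_sd = 0.05    # assumed standard deviation of the random noise, in V

one_reading = rng.normal(true_value, noise_sd)            # a single noisy reading
avg_of_100 = rng.normal(true_value, noise_sd, 100).mean() # average of 100 readings

print(abs(one_reading - true_value))  # random error of a single reading
print(abs(avg_of_100 - true_value))   # typically much smaller
```

Averaging $n$ readings shrinks the random error by roughly a factor of $\sqrt{n}$, so 100 readings give about a tenfold improvement.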

Common Sources of Systematic Errors

Definition:

  1. Calibration Error - Errors introduced by the calibration of a measurement device.

This type of error can arise from

  • inaccuracy of "known" values
  • nonlinearity of the output of the sensor being calibrated
Figure 5: Nonlinear Sensor Output

Definition:

  1. Loading Error - Error due to sampling a system before it has fully settled after experiencing a change of some sort.

Definition:

  1. Spatial Error - Error due to an underlying assumption of spatial uniformity when the measurand actually varies with position.
Figure 6: Spatial Error Example

Will $T_1$ and $T_2$ have the same value?

Example Problem 1

Given:

In a calibration test, measurements using a digital voltmeter have been made of a battery voltage known to have an actual voltage of $6.11V$. The readings are:

$5.98V$, $6.05V$, $6.15V$, $6.06V$, $5.99V$, $6.00V$, $6.08V$, $6.03V$ and $6.11V$.

Required:

For the given data

(a) Estimate the systematic error of the voltmeter.

(b) Estimate the maximum random error of the voltmeter.

Solution:

In [1]:
V_t = 6.11 # in V
In [2]:
data = [5.98, 6.05, 6.15, 6.06, 5.99, 6.00, 6.08, 6.03, 6.11] # in V
In [3]:
import numpy as np
In [4]:
V_avg = np.mean(data)
print(V_avg)
6.050000000000001
In [5]:
B = abs(V_avg - V_t) # systematic (bias) error, in V
print(B)
0.05999999999999961
In [6]:
print(round(B, 2)) # answer to part (a)
0.06

Part (a): The systematic error of the data is $0.06 V$.


In [7]:
E = [abs(min(data)-V_avg), abs(max(data)-V_avg)] # deviations of the extreme readings from the average, in V
print(E)
[0.07000000000000028, 0.09999999999999964]
In [8]:
P = max(E) # maximum random error, in V
print(P)
0.09999999999999964
In [9]:
print(round(P, 2))
0.1

Part (b): The maximum random error of the data is $0.1 V$.


Discussion:

It should be noted that the maximum random error does not do an adequate job of describing the random error of the system. A single bad reading could skew the results. We'll use statistical methods to handle these problems later in the course.
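As a one-line preview of those statistical methods (not part of the original solution), the sample standard deviation summarizes the spread of the readings without letting a single bad reading dominate the way the maximum random error can:

```python
import numpy as np

data = [5.98, 6.05, 6.15, 6.06, 5.99, 6.00, 6.08, 6.03, 6.11]  # in V
s = np.std(data, ddof=1)  # sample standard deviation, in V
print(round(s, 3))        # 0.057
```

Note the `ddof=1` argument, which makes NumPy divide by $n-1$ (the sample standard deviation) rather than $n$.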


Definition:

  1. Range - Values of the measurand to which the measuring system will respond adequately.
Figure 7: Tachometer

Definition:

  1. Accuracy - The closeness of the measurement value to the true value.

often specified in terms of the full scale

generally includes both residual systematic and random errors

Figure 8: Voltmeter Output

rule of thumb: select instruments so that the measurand falls in the middle to upper portion of the instrument's range

Definition:

  1. Precision - The trait of a measuring system with a small random error.
Figure 9: Accuracy and Precision

Case 1 is neither accurate nor precise

Case 2 is accurate but not precise

Case 3 is precise but not accurate

Case 4 is accurate and precise

Definition:

  1. Resolution - The smallest measurand value that can be determined using the instrument.
Figure 10: Multimeter Resolution

Definition:

  1. Resolution Error - The error produced by the instrument's inability to follow changes in the measurand smaller than its resolution.

Definition:

  1. Readability - How easily an operator can read the output of a non-digital instrument.

best practice is to read the instrument to the closest graduation mark

Fun fact: The human eye has trouble resolving anything below $0.01in$

Figure 11: Tape Measure Readability

Definition:

  1. Repeatability - The ability of an instrument to produce the same output reading for a given measurand.

A lack of repeatability produces a random error known as a repeatability error

Definition:

  1. Linearity - The instrument's ability to produce an output that is linearly proportional to the measurand experienced.

linearity error is produced when an instrument does not behave linearly

Figure 12: Nonlinear Instrument

The amount by which the actual output deviates from the ideal at $0\%$ of the measurand's full scale is the zero offset

Definition:

  1. Sensitivity - The amount of change in the output of an instrument per unit change of the input.

Mathematically, this may be written as

$$ sensitivity = \frac{d\left( output \right)}{d\left( input \right)} \approx \frac{\Delta output}{\Delta input} $$

For example, the ADXL335 accelerometer has a sensitivity of $300mV/g$
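Using the stated $300mV/g$ sensitivity, a change in output voltage can be converted back to a change in acceleration by dividing by the sensitivity (the voltage reading below is an assumed value for illustration):

```python
sensitivity = 0.300   # ADXL335 sensitivity, in V per g
delta_output = 0.150  # assumed measured change in output voltage, in V
delta_input = delta_output / sensitivity  # inferred change in acceleration, in g
print(round(delta_input, 3))  # 0.5
```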

Example Problem 2

Given:

An angular velocity measuring device (tachometer) can measure a mechanical shaft's speed in the range from $0$ to $5000 rpm$. It has an accuracy of $\pm 5\%$ of full scale. You notice that when the shaft speed is zero, the device has an average reading of $200 rpm$.

Required:

What is the maximum error that you might estimate in reading a shaft speed of $3500 rpm$?

Solution:

In [10]:
R = [0, 5000] # in rpm
In [11]:
Ap = 5 # as a percentage
In [12]:
Z = 200 # in rpm
In [13]:
S = 3500 # in rpm
In [14]:
A_e = (Ap/100)*(R[1]-R[0]) # accuracy error: 5% of full scale, in rpm
print(A_e)
250.0
In [15]:
M = A_e + Z # worst case: accuracy error plus the zero-offset reading, in rpm
print(M)
450.0

The maximum expected error is $450 rpm$.